8 research outputs found

    Redefining A in RGBA: Towards a Standard for Graphical 3D Printing

    Advances in multimaterial 3D printing have the potential to reproduce various visual appearance attributes of an object in addition to its shape. Since many existing 3D file formats encode color and translucency by RGBA textures mapped to 3D shapes, RGBA information is particularly important for practical applications. In contrast to color (encoded by RGB), which is specified by the object's reflectance, selected viewing conditions and a standard observer, translucency (encoded by A) is not linked to any measurable physical or perceptual quantity. Thus, reproducing translucency encoded by A is open for interpretation. In this paper, we propose a rigorous definition for A suitable for use in graphical 3D printing, which is independent of the 3D printing hardware and software, and which links both optical material properties and perceptual uniformity for human observers. By deriving our definition from the absorption and scattering coefficients of virtual homogeneous reference materials with an isotropic phase function, we achieve two important properties. First, a simple adjustment of A is possible, which preserves the translucency appearance if an object is re-scaled for printing. Second, determining the value of A for a real (potentially non-homogeneous) material can be achieved by minimizing a distance function between light transport measurements of this material and simulated measurements of the reference materials. Such measurements can be conducted by commercial spectrophotometers used in graphic arts. Finally, we conduct visual experiments employing the method of constant stimuli, and derive from them an embedding of A into a nearly perceptually uniform scale of translucency for the reference materials.
    Comment: 20 pages (incl. appendices), 20 figures. Version with higher quality images: https://cloud-ext.igd.fraunhofer.de/s/pAMH67XjstaNcrF (main article) and https://cloud-ext.igd.fraunhofer.de/s/4rR5bH3FMfNsS5q (appendix). Supplemental material including code: https://cloud-ext.igd.fraunhofer.de/s/9BrZaj5Uh5d0cOU/downloa
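    The fitting step the abstract describes, minimizing a distance between real and simulated light-transport measurements, can be sketched as a one-dimensional search over candidate A values. This is a minimal illustration under assumed names, not the authors' implementation: `simulate_reference`, the candidate grid, and the least-squares distance are all assumptions.

```python
import numpy as np

def estimate_alpha(measured, simulate_reference, alphas=np.linspace(0.0, 1.0, 1001)):
    """Pick the reference-material A whose simulated light-transport
    measurements are closest (in least squares) to those of a real material.

    measured: 1-D array of spectrophotometer readings (illustrative).
    simulate_reference: callable mapping a candidate A to the same-shaped
    array of simulated readings for the reference material (assumed).
    """
    # Brute-force the candidate values and keep the best fit.
    distances = [np.sum((simulate_reference(a) - measured) ** 2) for a in alphas]
    return alphas[int(np.argmin(distances))]
```

    In practice one would replace the grid search with a proper optimizer, but the structure, a scalar A selected by comparing measurement vectors, is the same.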

    Development of a Spectral Acquisition System using a Tunable Light Source and a Trichromatic Camera

    Colours perceived by humans are influenced by a large number of factors. The same object may look different under different lighting conditions. This is also true for images captured by a camera sensor. In addition, each measuring device has its own capturing properties. For example, the RGB intensities captured by different cameras differ for the same object under the same lighting conditions. To avoid these variations in the observed colour, it is necessary to know the ground truth of the object's colour data, which is given by its spectral reflectance. In this thesis, we devise a method for rapidly estimating spectral reflectances using a tunable monochromatic light source and a trichromatic camera. The estimation is a two-step process: first, we determine the camera sensitivities; second, we use the estimated sensitivities to calculate the reflectances. Both steps use the same setup, which allows us to use software application programming interfaces (APIs) to obtain reflectances for a large number of targets with high speed and accuracy. To evaluate our method, we employ a spectroradiometer, which can directly measure the spectra of the targets.
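    One simple linear-algebra reading of the two-step process can be sketched as follows. The linear camera model, the matrix shapes, and the plain least-squares solvers are illustrative assumptions, not the thesis's actual method.

```python
import numpy as np

def estimate_sensitivities(reflectances, responses):
    """Step 1: assuming a linear camera model C = R @ S.T, solve for the
    three sensitivity curves S from known target spectra and RGB responses.

    reflectances: (targets, bands) known spectra; responses: (targets, 3).
    """
    s_t, *_ = np.linalg.lstsq(reflectances, responses, rcond=None)
    return s_t.T  # (3, bands)

def estimate_reflectance(sensitivities, response):
    """Step 2: minimum-norm reflectance consistent with one RGB response.
    The real problem is underdetermined, so a practical system would add
    priors or regularization; this returns the plain least-squares answer.
    """
    r, *_ = np.linalg.lstsq(sensitivities, response, rcond=None)
    return r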

    An observer-metamerism sensitivity index for electronic displays

    The effect of observer metamerism induced by electronic displays depends to a large extent on their primary spectra (red, green, and blue in the most common case). In particular, for narrow-band primary spectra whose peak wavelengths lie in the range of high variability of the observers' color-matching functions, some observers can experience very large differences between actual surface colors (e.g. in a light booth) and displayed colors if the monitor is optimized for the International Commission on Illumination (CIE) 1931 standard observer. However, because narrow-band light-emitting diodes lead to larger color gamuts, more and more monitors with very narrow-band primaries are coming onto the market without manufacturers taking into account the associated problem of observer variation. Being able to measure these variations accurately and efficiently is therefore an important objective. In this paper, we propose a new approach to predict the extent of observer metamerism for a particular multiprimary display. Unlike existing dedicated models, ours does not depend on a reference illuminant and a set of reflectance spectra, and it is computationally more efficient.
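    A toy version of such a sensitivity index can be sketched by projecting a single displayed spectrum through several observers' color-matching functions and measuring the spread of the resulting tristimulus values. Everything here, the shapes, the Euclidean spread measure, and the variable names, is an assumption for illustration, not the proposed model.

```python
import numpy as np

def metamerism_index(primaries, observer_cmfs, weights):
    """Toy index: drive the display with RGB `weights`, form the emitted
    spectrum from the primary spectra, project it through each observer's
    color-matching functions, and return the maximum pairwise distance
    between the resulting tristimulus values.

    primaries: (3, bands) primary spectra; observer_cmfs: list of
    (3, bands) CMF sets (illustrative shapes).
    """
    spectrum = weights @ primaries                       # displayed spectrum
    tristim = np.array([cmf @ spectrum for cmf in observer_cmfs])
    diffs = np.linalg.norm(tristim[:, None] - tristim[None, :], axis=-1)
    return diffs.max()
```

    A spread of zero means all observers agree on the displayed color; the narrower the primaries relative to the CMF variability, the larger this spread tends to be.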

    3D printing spatially varying color and translucency

    We present an efficient and scalable pipeline for fabricating full-colored objects with spatially varying translucency from practical and accessible input data via multi-material 3D printing. Observing that the costs associated with BSSRDF measurement and processing are high, that the range of 3D-printable BSSRDFs is severely limited, and that the human visual system relies only on simple high-level cues to perceive translucency, we propose a method based on reproducing perceptual translucency cues. The input to our pipeline is an RGBA signal defined on the surface of an object, making our approach accessible and practical for designers. We propose a framework for extending standard color management and profiling to combined color and translucency management, using a gamut-correspondence strategy we call opaque relative processing. We present an efficient streaming method to compute voxel-level material arrangements, achieving both realistic reproduction of measured translucent materials and artistic effects involving multiple fully or partially transparent geometries.
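    The idea of turning a continuous A field into a discrete voxel-level material arrangement can be illustrated with a toy stochastic halftone. The real streaming method is far more sophisticated; the probability rule and names below are assumptions for illustration only.

```python
import numpy as np

def assign_materials(alpha_field, rng=np.random.default_rng(0)):
    """Toy voxel-level halftone: place opaque material with probability A
    and clear material otherwise, per voxel, so that the local fraction of
    opaque voxels approximates the requested A value.

    alpha_field: array of A values in [0, 1], one per voxel (illustrative).
    """
    return np.where(rng.random(alpha_field.shape) < alpha_field,
                    "opaque", "clear")
```

    Real pipelines use structured halftoning and account for inter-voxel light transport, but the core step, discretizing a continuous appearance signal into a finite material set, is the same.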

    Visual perception of 3D printed translucent objects

    In order to reproduce translucent objects on 3D printers employing fully transparent (or clear) material, modeling the human visual perception of translucency is crucial. In this preliminary study, a set of 256 texture-less samples was created by mixing white and clear materials using multi-jet 3D printing. The samples differ in both lateral light-transport properties and transmittance. Two psychophysical experiments were conducted to reveal the relationship between transmittance and a perceptually uniform scale of translucency. The results show that Stevens' power law describes this relationship well within the optically thin range of samples. Furthermore, sensitivity to lateral light transport is small compared to transmittance for this texture-less sample set.
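    A Stevens'-power-law fit of the kind described, perceived translucency proportional to a power of transmittance, can be reproduced with a standard log-log regression. The function and variable names are illustrative; the study's actual fitting procedure may differ.

```python
import numpy as np

def fit_stevens(transmittance, perceived):
    """Fit perceived = k * T**n by linear regression in log-log space,
    where log(perceived) = n * log(T) + log(k).

    Inputs are 1-D arrays of matched transmittance values and
    perceptual-scale values (illustrative names).
    """
    n, log_k = np.polyfit(np.log(transmittance), np.log(perceived), 1)
    return np.exp(log_k), n
```

    The exponent n summarizes how compressively (n < 1) or expansively (n > 1) perceived translucency grows with physical transmittance.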

    A method for joint color and translucency 3D printing and a joint color and translucency 3D printing device

    A method and a device for three-dimensional joint color and translucency printing, wherein at least a first non-transparent printing material with a first printing material color and at least one transparent printing material (6) are used to construct a printing object (7), wherein an arrangement of the printing materials is determined based on a desired color reproduction of the printing object (7), and wherein the arrangement of the printing materials is further determined based on a desired translucency reproduction of the printing object (7).

    End-to-end Color 3D Reproduction of Cultural Heritage Artifacts: Roseninsel Replicas

    Planning exhibitions of cultural artifacts is always challenging. Artifacts can be very sensitive to the environment, and therefore their display can be risky. One way to circumvent this is to build replicas of these artifacts. Here, 3D digitization and reproduction, either physical via 3D printing or virtual via computer graphics, can be the method of choice. For this use case, we present a workflow spanning photogrammetric acquisition in challenging environments to representation of the acquired 3D models in different ways, such as online visualization and color 3D-printed replicas. This work can also be seen as a first step towards establishing a workflow for full-color end-to-end reproduction of artifacts. Our workflow was applied to cultural artifacts found around the “Roseninsel” (Rose Island), an island in Lake Starnberg (Bavaria), in collaboration with the Bavarian State Archaeological Collection in Munich. We demonstrate the results of the end-to-end reproduction workflow leading to virtual replicas (online 3D visualization, virtual and augmented reality) and physical replicas (3D-printed objects). In addition, we discuss potential optimizations and briefly present an improved state-of-the-art 3D digitization system for fully autonomous acquisition of the geometry and colors of cultural heritage objects.